
    Early Detection of Online Auction Opportunistic Sellers Through the Use of Negative-Positive Feedback

    Apparently, fraud is a growth industry. Monetary losses from Internet fraud have increased every year since they were first officially reported by the Internet Crime Complaint Center (IC3) in 2000. Prior research studies and third-party reports of fraud show rates substantially higher than eBay’s reported negative feedback rate of less than 1%; the conclusion is that most buyers withhold negative feedback. In a forensic case study of a single opportunistic eBay seller, researchers Nikitov and Stone found that buyers sometimes embedded negative comments in positive feedback to avoid seller retaliation and damage to their own reputations. This category of positive feedback was described as “negative-positive” feedback; an example is “Good product, but slow shipping.” This study investigated using negative-positive feedback as a signature to identify potential opportunistic sellers in an online auction population. As prior researchers working with data extracted from the eBay web site experienced, the volume of data to be analyzed was massive, and the nature of the analysis required (judgment of seller behavior and contextual analysis of buyer feedback comments) could not be automated. The traditional method of using multiple dedicated human raters would have taken months of labor at a correspondingly high cost; instead, crowdsourcing in the form of Amazon Mechanical Turk reduced the analysis time to a few days, at a fraction of the traditional labor cost. The results show that the presence of subtle buyer behavior in the form of negative-positive feedback comments is an inter-buyer signal that a seller is behaving fraudulently: sellers with negative-positive feedback were 1.82 times more likely to be fraudulent, and a correlation exists between an increasing number of negative-positive feedback comments and an increasing probability that a seller is acting fraudulently. For every one-unit increase in the number of negative-positive feedback comments, a seller was 4% more likely to be fraudulent.
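    The reported effect sizes read as odds ratios from a logistic-style model. A minimal sketch of how such ratios translate into probabilities (the baseline fraud rate and the compounding form below are illustrative assumptions, not figures from the study):

```python
# Hypothetical illustration of the reported effect sizes (odds ratios),
# not the study's actual model: each additional negative-positive
# comment is assumed to multiply a seller's odds of fraud by 1.04.
OR_PER_COMMENT = 1.04  # odds ratio per additional negative-positive comment

def fraud_probability(baseline_prob: float, n_comments: int) -> float:
    """Probability of fraud implied by n negative-positive comments,
    compounding the per-comment odds ratio onto a baseline probability."""
    odds = baseline_prob / (1.0 - baseline_prob)   # convert probability to odds
    odds *= OR_PER_COMMENT ** n_comments           # apply odds ratio n times
    return odds / (1.0 + odds)                     # convert odds back

# With an assumed 5% baseline, ten such comments raise the odds by ~48%,
# lifting the implied fraud probability from 0.050 to about 0.072.
p10 = fraud_probability(0.05, 10)
```

    Note that a 4% increase per comment compounds: the odds, not the probability, grow by a constant factor, so the probability curve flattens as it approaches 1.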

    Care, laboratory beagles and affective utopia

    A caring approach to knowledge production has been portrayed as epistemologically radical, ethically vital, and as fostering continuous responsibility between researchers and research subjects. This article examines these arguments by focusing on the ambivalent role of care within the first large-scale experimental beagle colony, a self-professed ‘beagle utopia’ at the University of California, Davis (1951-1986). We argue that care was at the core of the beagle colony: the lived environment was re-shaped in response to animals ‘speaking back’ to researchers, and ‘love’ and ‘kindness’ were important considerations during staff recruitment. Ultimately, however, we show that care relations were used to manufacture compliance, preventing the predetermined ends of the experiment from being troubled. Rather than suggesting Davis would have been less ethically troubling, or more epistemologically radical, with ‘better’ care, we argue that the case troubles existing care theory and that greater attention needs to be paid to histories, contexts, and exclusions.

    Exact distribution of a pattern in a set of random sequences generated by a Markov source: applications to biological data

    Background: In bioinformatics it is common to search for a pattern of interest in a potentially large set of rather short sequences (upstream gene regions, proteins, exons, etc.). Although many methodological approaches allow practitioners to compute the distribution of a pattern count in a random sequence generated by a Markov source, no specific developments have taken into account the counting of occurrences in a set of independent sequences. We address this problem by deriving efficient approaches and algorithms to perform these computations for both low- and high-complexity patterns in the framework of homogeneous or heterogeneous Markov models.
    Results: The latest advances in the field allowed us to use a technique of optimal Markov chain embedding based on deterministic finite automata to introduce three innovative algorithms. Algorithm 1 is the only one able to deal with heterogeneous models; it also avoids any product of convolutions of the pattern distribution over individual sequences. When working with homogeneous models, Algorithm 2 yields a dramatic reduction in complexity by reusing previous computations to obtain moment-generating functions efficiently. In the particular case of low- or moderate-complexity patterns, Algorithm 3 exploits power computation and binary decomposition to further reduce the time complexity to a logarithmic scale. All these algorithms, and their interest relative to existing ones, were then tested and discussed on a toy example and three biological data sets: structural patterns in protein loop structures, PROSITE signatures in a bacterial proteome, and transcription factors in upstream gene regions. On these data sets we also compared our exact approaches to the tempting approximation that consists of concatenating the sequences in the data set into a single sequence.
    Conclusions: Our algorithms prove effective and able to handle real data sets with multiple sequences, as well as biological patterns of interest, even when the latter display high complexity (PROSITE signatures, for example). In addition, these exact algorithms avoid the edge effect observed under the single-sequence approximation, which leads to erroneous results, especially when the marginal distribution of the model converges slowly toward the stationary distribution. We end with a discussion of our method and its potential improvements.
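    The core idea of optimal Markov chain embedding can be illustrated with a minimal sketch (not the paper's Algorithms 1-3): a KMP-style deterministic automaton for the pattern is tracked jointly with a first-order Markov source, and the joint distribution of (automaton state, occurrence count) is propagated position by position to obtain the exact count distribution in a single sequence. The pattern, alphabet, and transition probabilities below are toy assumptions.

```python
def kmp_states(pattern, alphabet):
    """Build the pattern-matching automaton: delta[q][c] is the new state
    after reading symbol c in state q, where state q means 'the last q
    symbols read match pattern[:q]'. State len(pattern) marks a match."""
    m = len(pattern)
    delta = [dict() for _ in range(m + 1)]
    for q in range(m + 1):
        for c in alphabet:
            s = pattern[:q] + c
            k = min(m, q + 1)
            # longest prefix of the pattern that is a suffix of s
            while k > 0 and s[-k:] != pattern[:k]:
                k -= 1
            delta[q][c] = k
    return delta

def count_distribution(pattern, alphabet, P, pi, n):
    """Exact distribution of (overlapping) occurrences of `pattern` in a
    length-n sequence from a first-order Markov chain with transition
    probabilities P[a][b] and initial distribution pi[a]."""
    m = len(pattern)
    delta = kmp_states(pattern, alphabet)
    # dist[(q, prev, k)] = P(automaton state q, last symbol prev, k matches)
    dist = {}
    for a in alphabet:
        q = delta[0][a]
        k = 1 if q == m else 0
        dist[(q, a, k)] = dist.get((q, a, k), 0.0) + pi[a]
    for _ in range(n - 1):
        nxt = {}
        for (q, prev, k), p in dist.items():
            for b in alphabet:
                q2 = delta[q][b]
                k2 = k + (1 if q2 == m else 0)
                key = (q2, b, k2)
                nxt[key] = nxt.get(key, 0.0) + p * P[prev][b]
        dist = nxt
    counts = {}
    for (q, prev, k), p in dist.items():
        counts[k] = counts.get(k, 0.0) + p
    return counts
```

    For pattern "AA", a uniform two-letter chain, and n = 3, this yields P(0) = 5/8, P(1) = 2/8, P(2) = 1/8, matching direct enumeration of the eight sequences. Handling a set of independent sequences without convolving the per-sequence distributions, as in the paper's Algorithm 1, requires further machinery not shown here.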

    Energy Estimation of Cosmic Rays with the Engineering Radio Array of the Pierre Auger Observatory

    The Auger Engineering Radio Array (AERA) is part of the Pierre Auger Observatory and is used to detect the radio emission of cosmic-ray air showers. These observations are compared to the data of the surface detector stations of the Observatory, which provide well-calibrated information on the cosmic-ray energies and arrival directions. The response of the radio stations in the 30 to 80 MHz regime has been thoroughly calibrated to enable the reconstruction of the incoming electric field. For the latter, the energy deposit per area is determined from the radio pulses at each observer position and is interpolated using a two-dimensional function that takes into account signal asymmetries due to interference between the geomagnetic and charge-excess emission components. The spatial integral over the signal distribution gives a direct measurement of the energy transferred from the primary cosmic ray into radio emission in the AERA frequency range. We measure 15.8 MeV of radiation energy for a 1 EeV air shower arriving perpendicularly to the geomagnetic field. This radiation energy, corrected for geometrical effects, is used as a cosmic-ray energy estimator. Performing an absolute energy calibration against the surface-detector information, we observe that this radio-energy estimator scales quadratically with the cosmic-ray energy, as expected for coherent emission. We find an energy resolution of the radio reconstruction of 22% for the full data set and 17% for a high-quality subset containing only events with at least five radio stations with signal.
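    The quadratic scaling can be inverted to sketch how a measured radiation energy maps back to a cosmic-ray energy. The 15.8 MeV normalization at 1 EeV is from the abstract, but the sin²(α) form of the geomagnetic-angle correction used below is a simplifying assumption, not the collaboration's full correction:

```python
import math

# Normalization from the abstract: 15.8 MeV of radiation energy for a
# 1 EeV shower arriving perpendicular to the geomagnetic field.
E_RAD_REF_MEV = 15.8

def cosmic_ray_energy_eev(e_rad_mev: float, sin_alpha: float = 1.0) -> float:
    """Invert the assumed scaling E_rad = 15.8 MeV * sin^2(alpha) * (E_CR/EeV)^2,
    where alpha is the angle between shower axis and geomagnetic field."""
    corrected = e_rad_mev / (sin_alpha ** 2)   # geometry correction (assumed form)
    return math.sqrt(corrected / E_RAD_REF_MEV)

# A perpendicular shower radiating 63.2 MeV (4x the reference) implies
# a cosmic-ray energy of about 2 EeV, since energy enters quadratically.
```

    The quadratic dependence is the signature of coherent emission: the radiated field amplitude scales linearly with the number of electromagnetic particles, so the radiated energy scales with its square.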

    Measurement of the Radiation Energy in the Radio Signal of Extensive Air Showers as a Universal Estimator of Cosmic-Ray Energy

    We measure the energy emitted by extensive air showers in the form of radio emission in the frequency range from 30 to 80 MHz. Exploiting the accurate energy scale of the Pierre Auger Observatory, we obtain a radiation energy of 15.8 ± 0.7 (stat) ± 6.7 (sys) MeV for cosmic rays with an energy of 1 EeV arriving perpendicularly to a geomagnetic field of 0.24 G, scaling quadratically with the cosmic-ray energy. A comparison with predictions from state-of-the-art first-principle calculations shows agreement with our measurement. The radiation energy provides direct access to the calorimetric energy in the electromagnetic cascade of extensive air showers. Comparison with our result thus allows the direct calibration of any cosmic-ray radio detector against the well-established energy scale of the Pierre Auger Observatory.

    Automated, compliant, high-flow common carotid to middle cerebral artery bypass

    The authors describe the use of the Cardica C-Port xA Distal Anastomosis System to perform an automated, high-flow extracranial-intracranial bypass. The C-Port system has been developed and tested in coronary artery bypass surgery for rapid distal coronary artery anastomoses. Air-powered, it performs an automated end-to-side anastomosis within seconds by nearly simultaneously making an arteriotomy and inserting 13 microclips into the graft and recipient vessel. Intracranial use of the device was first simulated in a cadaver prepared for microsurgical anatomical dissection. The authors used the system in a 43-year-old man who sustained a subarachnoid hemorrhage after being assaulted and was found to have a traumatic pseudoaneurysm of the proximal intracranial internal carotid artery. The aneurysm appeared to be enlarging on serial imaging studies, and it was anticipated that a bypass would probably be needed to treat the lesion. An end-to-side bypass was performed with the C-Port system using a saphenous vein conduit extending from the common carotid artery to the middle cerebral artery. The bypass was demonstrated to be patent on intraoperative and postoperative arteriography. The patient had a temporary hyperperfusion syndrome and subsequently made a good neurological recovery. The C-Port system facilitates the performance of a high-flow extracranial-intracranial bypass with short periods of temporary arterial occlusion. Because of the size and configuration of the device, its use is not feasible in all anatomical situations that require a high-flow bypass; however, it is a useful addition to the armamentarium of the neurovascular surgeon.